MkLlm

Node for LLM-based text generation.

Example: Regular

Jinja

{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}

Python

MkLlm("Write a poem about MkDocs", model="openai:gpt-4o-mini")

In the realm of code where the beacons glow,
A tool emerges, a guiding flow.
MkDocs, the scribe of the digital age,
Crafting documentation, turning each page.

With Markdown’s grace, it takes its stand,
A syntax so simple, like grains of sand.
Headers and lists, in harmony blend,
A narrative woven, where insights extend.

From project to project, it journeys afar,
Hosting its wisdom like a bright northern star.
Themes that are sleek, and layouts that shine,
Each page a testament, each line so fine.

In GitHub’s embrace, it finds its home,
Collaborative spirit in every tome.
Versioned with care, as code evolves,
A cycle of sharing, where knowledge resolves.

Build it and watch as it takes to the sky,
Static sites flourish, no need to comply.
With every command, a universe grows,
Documenting journeys, as curiosity flows.

So here’s to MkDocs, our digital friend,
Crafting the words that we all want to send.
In the vast web of wisdom, it plays a great role,
Helping us share, connect, and console.


Bases: MkText

text property

text: str

__init__

__init__(
    user_prompt: str,
    system_prompt: str | None = None,
    model: str = "openai:gpt-4o-mini",
    context: str | None = None,
    extra_files: Sequence[str | PathLike[str]] | None = None,
    **kwargs: Any
)

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| user_prompt | str | Main prompt for the LLM | required |
| system_prompt | str \| None | System prompt to set LLM behavior | None |
| model | str | LLM model identifier to use | 'openai:gpt-4o-mini' |
| context | str \| None | Main context string | None |
| extra_files | Sequence[str \| PathLike[str]] \| None | Additional context files or strings | None |
| kwargs | Any | Keyword arguments passed to parent | {} |
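
For example, a node that pulls in additional context could be set up as follows. This is a usage sketch: the file path is illustrative, and reading node.text performs a real LLM call, so credentials for the chosen provider must be configured.

node = MkLlm(
    "Summarize the key configuration options.",
    system_prompt="You are a concise technical writer.",
    model="openai:gpt-4o-mini",
    context="Project: mknodes",
    extra_files=["docs/configuration.md"],  # illustrative path
)
markdown = node.text  # triggers the LLM request and returns the generated text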
| Name | Module | Description |
| ---- | ------ | ----------- |
| MkText | mknodes.basenodes.mktext | Class for any Markup text. |
graph TD
  MkLlm["mkllm.MkLlm"]
  MkText["mktext.MkText"]
  MkNode["mknode.MkNode"]
  Node["node.Node"]
  object["builtins.object"]
  MkText --> MkLlm
  MkNode --> MkText
  Node --> MkNode
  object --> Node
mknodes/templatenodes/mkllm/metadata.toml
[metadata]
icon = "mdi:view-grid"
status = "new"
name = "MkLlm"

[examples.regular]
title = "Regular"
jinja = """
{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}
"""

# [output.markdown]
# template = """
# <div class="grid cards" markdown="1">

# {% for item in node.items %}
# -   {{ item | indent }}
# {% endfor %}
# </div>
# """
mknodes.templatenodes.mkllm.MkLlm
class MkLlm(mktext.MkText):
    """Node for LLM-based text generation."""

    ICON = "material/format-list-group"
    REQUIRED_PACKAGES = [resources.Package("litellm")]

    def __init__(
        self,
        user_prompt: str,
        system_prompt: str | None = None,
        model: str = "openai:gpt-4o-mini",
        context: str | None = None,
        extra_files: Sequence[str | os.PathLike[str]] | None = None,
        **kwargs: Any,
    ):
        """Constructor.

        Args:
            user_prompt: Main prompt for the LLM
            system_prompt: System prompt to set LLM behavior
            model: LLM model identifier to use
            context: Main context string
            extra_files: Additional context files or strings
            kwargs: Keyword arguments passed to parent
        """
        super().__init__(**kwargs)
        self.user_prompt = user_prompt
        self.system_prompt = system_prompt
        self._model = model
        self._context = context
        self._extra_files = extra_files or []

    def _process_extra_files(self) -> list[str]:
        """Process extra context items, reading files if necessary.

        Returns:
            List of context strings.
        """
        context_items: list[str] = []

        def process_dir(path: UPath) -> list[str]:
            return [f.read_text() for f in path.rglob("*") if f.is_file()]

        for item in self._extra_files:
            try:
                path = UPath(item)
                if path.is_file():
                    context_items.append(path.read_text())
                elif path.is_dir():
                    context_items.extend(process_dir(path))
                else:
                    context_items.append(str(item))
            except Exception as exc:
                err_msg = f"Failed to read context file: {item}"
                logger.warning(err_msg)
                raise ValueError(err_msg) from exc

        return context_items

    @property
    def text(self) -> str:
        """Generate text using the LLM.

        Returns:
            Generated text content.
        """
        context_items = self._process_extra_files()
        combined_context = (
            "\n".join(filter(None, [self._context, *context_items])) or None
        )

        return complete_llm(
            self.user_prompt,
            self.system_prompt or "",
            model=self._model,
            context=combined_context or "",
        )
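
The complete_llm helper used by the text property is not part of this excerpt. Below is a minimal sketch of what such a helper could look like on top of litellm (the package MkLlm declares as required); the function name, the message layout, and the mapping of "provider:model" identifiers to litellm's "provider/model" scheme are assumptions, not the actual mknodes implementation.

from litellm import completion


def complete_llm_sketch(
    user_prompt: str,
    system_prompt: str = "",
    *,
    model: str = "openai:gpt-4o-mini",
    context: str = "",
) -> str:
    """Hypothetical stand-in for complete_llm, built on litellm's completion API."""
    # Assumption: MkLlm's "provider:model" identifiers map onto litellm's
    # "provider/model" scheme.
    litellm_model = model.replace(":", "/", 1)
    messages: list[dict[str, str]] = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    if context:
        messages.append({"role": "user", "content": f"Context:\n{context}"})
    messages.append({"role": "user", "content": user_prompt})
    response = completion(model=litellm_model, messages=messages)
    return response.choices[0].message.content or ""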